V-Invariant Methods for Generalised Least Squares Problems

Author

  • Michael R. Osborne
Abstract

An important consideration in generalised least squares problems is that the dimension of the covariance matrix V is the dimension of the data set, and so is large when the data set is large. Also, the problem solution can be well determined in cases where V is ill-conditioned or singular. Here aspects of a class of methods which factorize the design matrix while leaving V invariant, and which can be expected to be well behaved exactly when the original problem solution is well behaved, are considered. Implementation is most satisfactory when V is diagonal. This can be achieved by a preprocessing step in which V is replaced by the diagonal matrix D which results from the modified Cholesky factorization $PVP^T \to LDL^T$, where $L$ is unit lower triangular and $P$ is the permutation matrix associated with diagonal pivoting. Conditions under which this replacement is satisfactory are investigated.
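As a rough illustration of the preprocessing step described in the abstract, the sketch below (our own illustration, not code from the paper) computes a diagonally pivoted $LDL^T$ factorization of a symmetric positive semidefinite V and returns the permutation, the unit lower triangular factor, and the diagonal that would replace V; the function name and the tolerance used to detect a numerically singular pivot are illustrative choices.

```python
import numpy as np

def pivoted_ldlt(V, tol=1e-12):
    """Diagonally pivoted LDL^T factorization of a symmetric PSD matrix V.

    Returns perm, L, d with V[perm][:, perm] approximately L @ diag(d) @ L.T,
    where L is unit lower triangular.  Illustrative sketch only.
    """
    A = np.array(V, dtype=float, copy=True)
    n = A.shape[0]
    perm = np.arange(n)
    L = np.eye(n)
    d = np.zeros(n)
    for k in range(n):
        # Diagonal pivoting: bring the largest remaining diagonal entry to position k.
        j = k + int(np.argmax(np.diag(A)[k:]))
        if j != k:
            A[[k, j], :] = A[[j, k], :]
            A[:, [k, j]] = A[:, [j, k]]
            L[[k, j], :k] = L[[j, k], :k]
            perm[[k, j]] = perm[[j, k]]
        d[k] = A[k, k]
        if d[k] <= tol * max(d[0], 1.0):
            # Remaining pivots are treated as zero: V is numerically rank deficient.
            d[k:] = 0.0
            break
        L[k + 1:, k] = A[k + 1:, k] / d[k]
        # Rank-one (Schur complement) update of the trailing block.
        A[k + 1:, k + 1:] -= np.outer(L[k + 1:, k], A[k + 1:, k])
    return perm, L, d
```

In a preprocessing step of the kind the abstract describes, the same permutation would be applied to the rows of the design matrix and observations before the generalised least squares problem is solved with the diagonal D in place of V.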


Similar resources

Least-squares Proper Generalised Decompositions for Elliptic Systems

Proper generalised decompositions (PGDs) are a family of methods for efficiently solving high-dimensional PDEs. Convergence of PGD algorithms can be proven provided that the weak form of the PDE can be recast as the minimisation of some energy functional. A large number of elliptic problems, such as the Stokes problem, cannot be guaranteed to converge when employing a Galerkin PGD. Least-square...

Full text

Least squares methods in maximum likelihood problems

It is well known that the Gauss-Newton algorithm for solving nonlinear least squares problems is a special case of the scoring algorithm for maximizing log likelihoods. What has received less attention is that the computation of the current correction in the scoring algorithm, in both its line search and trust region forms, can be cast as a linear least squares problem. This is an important observ...
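A minimal sketch of the point made in this abstract, restricted to the Gauss-Newton special case it names: the correction at the current iterate solves a linear least squares problem in the Jacobian and residual, so it can be computed with an orthogonal-factorization-based solver rather than by forming normal equations. The model, data, and function names below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def gauss_newton_step(residual, jacobian, beta):
    """One Gauss-Newton correction, computed as a linear least squares problem.

    The step delta minimizes || r(beta) + J(beta) @ delta ||_2, solved here with
    np.linalg.lstsq rather than via the normal equations J.T @ J delta = -J.T @ r.
    """
    r = residual(beta)
    J = jacobian(beta)
    delta, *_ = np.linalg.lstsq(J, -r, rcond=None)
    return beta + delta

# Illustrative use: fit y ~ a * exp(b * t) to synthetic data.
t = np.linspace(0.0, 1.0, 20)
y = 2.0 * np.exp(-1.5 * t)

def residual(beta):
    a, b = beta
    return a * np.exp(b * t) - y

def jacobian(beta):
    a, b = beta
    e = np.exp(b * t)
    return np.column_stack([e, a * t * e])

beta = np.array([1.0, -1.0])
for _ in range(10):
    beta = gauss_newton_step(residual, jacobian, beta)
print(beta)  # approaches [2.0, -1.5]
```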

Full text

A New Feature Based Image Registration Algorithm

This paper introduces a new feature-based image registration algorithm which registers images by finding rotation and scale invariant features and matching them using an evidence accumulation process based on the Generalized Hough Transform. Once feature correspondence has been established, the transformation parameters are then estimated using non-linear least squares (NLLS) and the standard RAN...
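A hedged sketch of the final estimation stage mentioned here, under the assumption that the transformation is a similarity (scale, rotation, translation) and that feature correspondences have already been established; the function name and parametrization are ours, and in practice a robust loop such as RANSAC would wrap this fit to discard mismatched pairs.

```python
import numpy as np
from scipy.optimize import least_squares

def estimate_similarity(src, dst, x0=(1.0, 0.0, 0.0, 0.0)):
    """Fit scale s, rotation theta, translation (tx, ty) mapping src -> dst.

    src, dst: (n, 2) arrays of matched feature coordinates.
    Returns the parameter vector minimizing the point-to-point residuals (NLLS).
    """
    src = np.asarray(src, dtype=float)
    dst = np.asarray(dst, dtype=float)

    def residual(p):
        s, theta, tx, ty = p
        c, si = np.cos(theta), np.sin(theta)
        R = np.array([[c, -si], [si, c]])
        mapped = (s * src) @ R.T + np.array([tx, ty])
        return (mapped - dst).ravel()

    return least_squares(residual, x0).x
```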

Full text

Superlinearly convergent exact penalty projected structured Hessian updating schemes for constrained nonlinear least squares: asymptotic analysis

We present a structured algorithm for solving constrained nonlinear least squares problems, and establish its local two-step Q-superlinear convergence. The approach is based on an adaptive structured scheme due to Mahdavi-Amiri and Bartels of the exact penalty method of Coleman and Conn for nonlinearly constrained optimization problems. The structured adaptation also makes use of the ideas of N...

Full text

A Least Squares Approach to Estimating the Average Reservoir Pressure

The least squares method (LSM) is an accurate and rapid method for solving some analytical and numerical problems. This method can be used to estimate the average reservoir pressure in well test analysis. In fact, it may be employed to estimate parameters such as permeability (k) and pore volume (Vp). Regarding this point, buildup, drawdown, late transient test data, modified Muskat method, interfe...
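As an illustrative sketch only (the specific test, parametrization, and names are our assumptions, not taken from the paper): a least squares straight-line fit of buildup pressures against the Horner time ratio, whose intercept gives the extrapolated pressure commonly used when estimating the average reservoir pressure and whose slope would feed estimates of parameters such as permeability.

```python
import numpy as np

def horner_fit(tp, dt, pws):
    """Least squares fit of a Horner straight line to buildup data.

    tp  : producing time before shut-in
    dt  : shut-in times (array)
    pws : shut-in pressures at each dt (array)
    Fits pws ~ p_star - m * log10((tp + dt) / dt) and returns (p_star, m).
    Names and the Horner parametrization are illustrative assumptions.
    """
    dt = np.asarray(dt, dtype=float)
    x = np.log10((tp + dt) / dt)
    A = np.column_stack([np.ones_like(x), -x])  # columns for p_star and the slope m
    (p_star, m), *_ = np.linalg.lstsq(A, np.asarray(pws, dtype=float), rcond=None)
    return p_star, m
```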

Full text



Publication date: 2003